Training Neural Networks Beyond the Euclidean Distance
Abstract
When applied to forecasting or classification tasks, Neural Networks (NNs) are typically trained by Euclidean distance minimisation / likelihood maximisation, commonly irrespective of any other end-user preferences. In a number of situations, most notably time series forecasting, users may have objectives beyond Euclidean minimisation alone: they may, for example, want model predictions to be consistent in their residual error, to be accurate in their directional change predictions, or they may prefer error measures other than Euclidean distance. In this paper a methodology is devised for training Multi-Layer Perceptrons (MLPs) with competing error objectives using an Evolutionary Strategy (ES). Experimental evidence is presented, using the Santa Fe competition data, which supports the effectiveness of this training methodology. Identical networks are trained with different and competing error measures; different solutions are achieved, and these solutions are shown in general to reflect the error measure preferences defined during training. Areas for future research and model extensions are also discussed.
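The training scheme described in the abstract can be sketched in a few lines. The following is a minimal, illustrative example only: the synthetic series (standing in for the Santa Fe data), the one-hidden-layer MLP size, the simple (1+λ)-style ES, and the weighting parameter `alpha` that mixes mean squared error with directional-change error are all assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic one-step-ahead forecasting task (illustrative stand-in for a
# real time series such as the Santa Fe competition data).
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.1 * rng.standard_normal(t.size)
lag = 4
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]

def mlp_forward(w, X, hidden=8):
    # Unpack a flat weight vector into a one-hidden-layer MLP and evaluate it.
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * hidden].reshape(n_in, hidden); i += n_in * hidden
    b1 = w[i:i + hidden]; i += hidden
    W2 = w[i:i + hidden]; i += hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def composite_error(pred, y, alpha=0.5):
    # Competing objectives: Euclidean (MSE) fit vs. directional-change accuracy.
    mse = np.mean((pred - y) ** 2)
    dir_err = np.mean(np.sign(np.diff(pred)) != np.sign(np.diff(y)))
    return alpha * mse + (1 - alpha) * dir_err

n_w = lag * 8 + 8 + 8 + 1  # total weight count for the network above
parent = 0.1 * rng.standard_normal(n_w)
best = composite_error(mlp_forward(parent, X), y)
init_err = best
for gen in range(200):
    # (1+λ)-style ES: mutate the parent, keep the best offspring if it
    # improves the composite error. No gradients are used, which is what
    # lets the non-differentiable directional term enter the objective.
    offspring = parent + 0.05 * rng.standard_normal((10, n_w))
    errs = [composite_error(mlp_forward(o, X), y) for o in offspring]
    j = int(np.argmin(errs))
    if errs[j] < best:
        parent, best = offspring[j], errs[j]
```

Varying `alpha` trades pure Euclidean fit against directional accuracy, which mirrors the paper's finding that the trained solutions reflect the error-measure preferences defined during training.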
Similar papers
Using a Mahalanobis-Like Distance to Train Radial Basis Neural Networks
Radial Basis Neural Networks (RBNN) can approximate any regular function and have a faster training phase than other similar neural networks. However, the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Therefore, the activation function is symmetrical and all attributes are considered equally relevant. This could be solved by altering the me...
Face Recognition Using PCA, FLDA and Artificial Neural Networks
Face recognition is a system that identifies human faces through complex computational techniques. The paper explains two different algorithms for feature extraction. These are Principal Component Analysis and the Fisher Faces algorithm. It then explains how images can be recognized using a backpropagation algorithm on a feed-forward neural network. Two training databases, one containing 20 images a...
Evolving Generalized Euclidean Distances for Training RBNN
In Radial Basis Neural Networks (RBNN), the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Such a symmetrical activation assumes that all attributes are equally relevant, which might not be true. Non-symmetrical distances like Mahalanobis can be used. However, this distance is computed directly from the data covariance matrix and therefore t...
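The non-symmetrical activation the snippet above refers to can be illustrated with a Mahalanobis-style RBF neuron. This is a minimal sketch, assuming a precomputed inverse covariance matrix `cov_inv`; the function name and `width` parameter are illustrative, not taken from the cited paper.

```python
import numpy as np

def mahalanobis_rbf(x, center, cov_inv, width=1.0):
    # Gaussian RBF activation using a Mahalanobis rather than Euclidean
    # distance, so correlated or differently scaled attributes are no
    # longer treated as equally relevant.
    d = x - center
    dist2 = d @ cov_inv @ d
    return np.exp(-dist2 / (2 * width ** 2))
```

With `cov_inv` set to the identity matrix this reduces to the ordinary Euclidean RBF activation, which makes the Mahalanobis form a strict generalisation.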
Beyond Standard Metrics - On the Selection and Combination of Distance Metrics for an Improved Classification of Hyperspectral Data
Training and application of prototype based learning approaches such as Learning Vector Quantization, Radial Basis Function networks, and Supervised Neural Gas require the use of distance metrics to measure the similarities between feature vectors as well as class prototypes. While the Euclidean distance is used in many cases, the highly correlated features within the hyperspectral representati...
Aircraft Visual Identification by Neural Networks
In the present paper, an efficient method for three-dimensional aircraft pattern recognition is introduced. In this method, a set of simple area-based features extracted from silhouettes of aerial vehicles are used to recognize an aircraft type from its optical or infrared images taken by a CCD camera or a FLIR sensor. These images can be taken from any direction and distance relative to the fly...